Progressive multi-level distillation learning for pruning network

Authors

Abstract

Although classification methods based on deep neural networks have achieved excellent results in many tasks, they are difficult to apply in real-time scenarios because of high memory footprints and prohibitive inference times. Compared with unstructured pruning, structured pruning techniques reduce computation cost and model runtime more effectively, but they inevitably reduce the precision of the model. Traditional methods use fine-tuning to restore the damaged performance; however, there is still a large gap between the pruned network and the original one. In this paper, we propose progressive multi-level distillation learning to compensate for the loss caused by pruning. The pre-pruning and post-pruning networks serve as the teacher and student networks, respectively. The proposed approach utilizes the complementary properties of knowledge distillation, allowing the student network to learn the intermediate and output representations of the teacher network and thus reducing the influence of pruning on model performance. Experiments demonstrate that our approach performs better on the CIFAR-10, CIFAR-100, and Tiny-ImageNet datasets with different pruning rates. For instance, GoogLeNet achieves near-lossless pruning on the CIFAR-10 dataset at a 60% pruning rate. Moreover, this paper also shows that applying distillation learning during the pruning process achieves more significant performance gains than applying it after pruning is complete.
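The abstract outlines the mechanism but this listing contains no implementation. As a rough illustration, the PyTorch sketch below combines a task loss with output-level and intermediate-level distillation terms between a pre-pruning teacher and a pruned student. The helper name distillation_loss, the temperature T, the weights alpha and beta, and the MSE feature-matching term are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch of multi-level distillation between a pre-pruning teacher and a
# pruned student (PyTorch). Hyperparameters and the MSE feature term are
# assumed for illustration; they are not taken from the paper.
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits,
                      student_feats, teacher_feats,
                      labels, T=4.0, alpha=0.5, beta=0.1):
    """Task loss + output-level KD + intermediate-level feature matching."""
    # Supervised term: cross-entropy against ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)

    # Output level: soften both distributions with temperature T and
    # match them with KL divergence (scaled by T^2, as in standard KD).
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)

    # Intermediate levels: match hidden feature maps at several depths.
    # Assumes compatible shapes; after channel pruning, a 1x1 conv
    # adapter on the student side would typically be needed.
    feat = sum(F.mse_loss(s, t.detach())
               for s, t in zip(student_feats, teacher_feats))

    return ce + alpha * kd + beta * feat
```

In the progressive setting the abstract describes, such a loss would be applied at each pruning step, with the not-yet-pruned network as the teacher, rather than once after pruning finishes; this matches the reported finding that distilling during pruning outperforms distilling afterwards.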


Similar resources

Progressive Reinforcement Learning with Distillation for Multi-Skilled Motion Control

Deep reinforcement learning has demonstrated increasing capabilities for continuous control problems, including agents that can move with skill and agility through their environment. An open problem in this setting is that of developing good strategies for integrating or merging policies for multiple skills, where each individual skill is a specialist in a specific skill and its associated stat...


Pruning Techniques for Multi-Objective System-Level Design Space Exploration



RUM: network Representation learning throUgh Multi-level structural information preservation

We have witnessed the discovery of many techniques for network representation learning in recent years, ranging from encoding the context in random walks to embedding the lower order connections, to finding latent space representations with auto-encoders. However, existing techniques are looking mostly into the local structures in a network, while higher-level properties such as global community...


Browsing hierarchical data with multi-level dynamic queries and pruning

Please address correspondence to Ben Shneiderman. To appear in International Journal of Human-Computer Studies.


Unsupervised Learning for Information Distillation

Current document archives are enormously large and constantly increasing and that makes it practically impossible to make use of them efficiently. To analyze and interpret large volumes of speech and text of these archives in multiple languages and produce structured information of interest to its user, information distillation techniques are used. In order to access the key information in resp...



Journal

Journal title: Complex & Intelligent Systems

Year: 2023

ISSN: 2198-6053, 2199-4536

DOI: https://doi.org/10.1007/s40747-023-01036-0